You ensure that customers see the most relevant products in the right order – balancing customer needs, business goals, and technical feasibility. We work with over 1 million real-time API calls per hour and a huge amount of data to deliver highly performant and intelligent sorting mechanisms to millions of users.
We support you on your journey: individual learning opportunities, worldwide job opportunities, and technical training from our academy. The safety and well-being of our employees are important to us, which is why we set high standards for workplace safety.
Collaborate with data scientists and engineers to break down abstract business challenges into a defined product vision, MVPs, and iterative delivery plans. Act as a bridge between technical and business teams, simplifying complex technical concepts and data insights, making them understandable for stakeholders. You’ll also translate business needs into technical requirements.
The Oberalp Group is a management-driven family business, a house of brands that creates high-quality technical mountaineering products. We have six brands of our own: Salewa, Dynafit, Pomoca, Wild Country, Evolv, and LaMunt. As an exclusive partner of other brands in the sports sector, we offer our entire know-how in communication, sales, and image building.
What You’ll Do:

Collaborate in an Agile, International Team
- Work closely with colleagues from Romania, Germany, and Ukraine
- Design, estimate, develop, and implement software solutions aligned with business needs
- Actively communicate progress, risks, and technical decisions to stakeholders

Build Scalable Data Solutions
- Develop agnostic data products within a modern, cloud-native data ecosystem
- Support use cases across BI, Advanced Analytics, AI, and ML
- Translate business requirements into robust technical architectures
- Continuously enhance performance, quality, and cost-efficiency of solutions
- Proactively suggest improvements and best practices

What makes you stand out
- Degree in Computer Science, Economics, or a comparable qualification
- Minimum 3 years of experience as a BI Engineer or Data Engineer, focused on cloud-based architectures
- Strong expertise in Snowflake and DBT (Data Build Tool)
- Solid knowledge of SQL and data lakehouse architectures; Python is nice to have

Communication is Key
- Excellent communication skills in English (written and spoken), which are mandatory
- Ability to clearly explain technical concepts to both technical and non-technical stakeholders
- Strong stakeholder management and collaboration skills
- Comfortable working in cross-border, multicultural teams

We look forward to your application and to applicants who enrich our diverse culture!
What you can expect
- You take on functional and disciplinary responsibility for all FTEs in the Sales Data Hub within the federated data setup, including Data Engineers, Data Scientists, Data Governance roles, and Data Product Owners.
- You define, design, develop, and operate cloud-based data products for the Sales, Marketing, and Customer Service business units.
- You are responsible for the methodological integration of data assets and data products.
- You manage the Sales Data Hub operationally and develop it further as a specialized unit for data-driven solutions in Sales, Marketing, and Customer Service.
- You are responsible for the further development and ongoing maintenance of all sales-related models, based on feedback and requirements from the sales organization.
- You lead projects related to planning, expanding, and organizing new and existing products, in collaboration with the relevant business units and external partners.
- You assume technical responsibility for data products developed by or for the Sales, Marketing, and Customer Service areas within the Data Intelligence & Analytics team.
- You drive the continuous expansion, professionalization, and organizational development of the Sales Data Hub within the existing governance and organizational framework.
As part of our team, you will take on the following responsibilities:
- You actively develop our digital IoT solution, especially DKV LIVE firmware, hardware, and data processing, and shape the product vision for data-driven features.
- You dive deep into customer problems, analyze market trends, and derive product-relevant hypotheses.
- You validate your assumptions through structured feedback from relevant stakeholders (customers, partners, internal teams) and translate it into clear product decisions.
- You define actionable user stories within an agile development process and ensure they align with strategic goals.
- You plan sprint goals with the development team, prioritize the backlog, and maintain transparency on progress.
- You are part of the IoT team, working to further develop hardware components, ensure their compatibility with various vehicles, and enhance firmware stability.
- A key focus of your role is connecting, understanding, and mapping third-party data sources (OEM data, partner APIs, and more) so that our platform can correctly process, display, and transmit the data.
- You act as a bridge between technical and business stakeholders, setting clear priorities based on customer value and technical feasibility.
What makes you stand out
- You have a technical education or a degree in IT, or several years of experience in one of the following areas: IoT/connectivity, data integration, vehicle telematics.
- You have a solid understanding of IoT architectures, data flows, firmware/hardware relationships, and data interfaces.
- You are confident working with agile methodologies (Scrum/Kanban) and using the Atlassian toolchain (Jira/Confluence).
- You work analytically and with a customer orientation, communicate effectively, and can present complex technical concepts clearly to different target audiences.
- You take responsibility, make clear decisions, and work in a structured and quality-conscious manner.
- Excellent German and English skills complete your profile.
- Experienced IoT or data engineers with a strong interest in product work (problem analysis, concept development, planning, and communication) are also explicitly invited to apply.
What you will do
- Gather, analyze, and interpret data to fulfill reporting requests from business stakeholders
- Translate complex data into clear, actionable insights that support business strategy
- Support ad hoc analysis and contribute to data-driven decision-making
- Automate routine reporting processes and maintain existing reports to ensure reliability and improve data workflows
- Maintain, design, and implement advanced dashboards using tools like Google Looker Studio, enabling self-service analytics across the organization
- Collaborate with data engineers, data scientists, and stakeholders across the organization to ensure data quality and consistency while delivering data-driven insights that support supply-related business decisions
- Communicate findings effectively to technical and non-technical audiences
- Foster a data-driven culture within the organization, promoting the use of analytics in decision-making processes

Who you are
- At least 1–2 years of experience as a Data Analyst
- Proficiency in SQL for complex data analysis, reporting, and querying large datasets
- Experience with Google BigQuery
- Experience with data visualization tools (e.g., Looker Studio, Excel, or similar) to create compelling dashboards and reports
- Excellent communication skills in English
- Ability to translate fuzzy business requirements from diverse stakeholders into analytical requirements
What You’ll Do

Drive Measurable Business Impact
- Independently lead AI and analytics initiatives that generate tangible business value
- Actively contribute to and influence the company’s strategic direction

Apply Advanced Analytics & AI
- Develop and apply advanced statistical methods in an agile environment
- Work on business-critical questions using regression models, time series analysis, and machine learning & AI algorithms
- Collaborate cross-functionally or drive initiatives independently

Turn Data into Action
- Design and execute analyses on large datasets
- Translate findings into clear, actionable recommendations
- Work across the full spectrum, from Excel-based analysis to deep learning models

Build Scalable AI Solutions
- Develop and own customized Data Science and AI solutions
- Work with SQL on Azure and Snowflake platforms; Python is nice to have
- Lead projects end-to-end: from proof of concept (PoC) to fully operational production models
- Create audience-tailored presentations
- Translate complex analytical insights into clear business language
- Deliver compelling data storytelling to support decision-making at all levels

What makes you stand out
- Minimum 3 years of experience in Data Science, AI, Advanced Analytics, or similar roles
- Proven ability to generate measurable business value through data-driven solutions
- Strong expertise in statistical modeling and machine learning techniques
- Hands-on experience with Python, SQL, Azure, and Snowflake
- Experience building scalable models from PoC to production
- Strong communication and stakeholder management skills
- Ability to explain complex topics to both technical and non-technical audiences
- Bachelor’s or Master’s degree in Finance, Statistics, Computer Science, Mathematics, or a related quantitative field

We look forward to your application and to applicants who enrich our diverse culture!
- You are responsible for the conceptual, logical, and structural integrity of our Core Data Model as well as the Gold Layer across Azure, Snowflake, and dbt.
- You ensure that fragmented data sources are transformed into consistent, reusable, and decision-relevant data products, actively preventing the platform from drifting into team-specific, incompatible models.
- You define and maintain central business objects, canonical dimensions, shared metrics, and facts, ensuring that the Core Data Model serves as a stable, business-oriented foundation across all domains.
- You develop modeling standards, naming conventions, layering concepts (Staging → Intermediate → Gold), reuse patterns, and dbt design guidelines, and you ensure their consistent implementation across all teams.
- You safeguard the semantic consistency of the entire data model, resolve domain conflicts, ensure that identical business terms are modeled only once, and review changes affecting core layers.
- You act as the technical design authority for model changes in Snowflake/dbt, balancing local requirements with long-term model coherence, and ensuring that all models remain performant, scalable, maintainable, and of high quality.
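For readers unfamiliar with the Staging → Intermediate → Gold layering mentioned above, here is a minimal, purely illustrative sketch in Python. In a real dbt project each layer would be a SQL model; all column, metric, and record names below are hypothetical examples, not taken from the posting:

```python
# Illustrative Staging -> Intermediate -> Gold layering, modeled as
# plain Python functions over lists of dicts. Each layer has one job:
# staging cleans, intermediate applies shared business rules once,
# gold exposes a canonical metric. All names are hypothetical.

def staging(raw_rows):
    """Staging: rename and type-cast source columns, no business logic."""
    return [
        {"order_id": r["id"], "amount_eur": float(r["amt"]), "country": r["cty"].upper()}
        for r in raw_rows
    ]

def intermediate(stg_rows):
    """Intermediate: apply a shared business rule once (drop zero-value test orders)."""
    return [r for r in stg_rows if r["amount_eur"] > 0]

def gold(int_rows):
    """Gold: one canonical, reusable metric (revenue) per dimension (country)."""
    revenue_by_country = {}
    for r in int_rows:
        revenue_by_country[r["country"]] = (
            revenue_by_country.get(r["country"], 0.0) + r["amount_eur"]
        )
    return revenue_by_country

raw = [
    {"id": 1, "amt": "10.0", "cty": "de"},
    {"id": 2, "amt": "0", "cty": "de"},   # test order, filtered in intermediate
    {"id": 3, "amt": "5.5", "cty": "at"},
]
print(gold(intermediate(staging(raw))))  # {'DE': 10.0, 'AT': 5.5}
```

The point of the layering is that the filter in `intermediate` is defined exactly once, so every downstream Gold model agrees on which orders count.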
Your new challenge
- Lead internal and external development teams and provide overall technical leadership
- Define, review, and continuously improve system and solution architecture
- Translate business requirements into secure, scalable technical solutions
- Own the Azure cloud architecture for the product, including resource management, environments, deployments, and cost control
- Review, maintain, and improve infrastructure
- Ensure compliance with security, data protection, and operational standards
- Manage releases, technical milestones, access, licenses, and system documentation
- Define product vision, roadmap, annual scope, milestones, and non-functional requirements
- Lead technical stakeholder communication and define integration requirements
- Manage product development budgeting, resource planning, and forecasting
- Own and prioritize the product backlog; define epics, user stories, acceptance criteria, and scope decisions
- Support sprint execution, clarify requirements, remove blockers, and accept delivered work
- Establish and improve agile ways of working; facilitate planning, refinement, and retrospectives
- Coach teams on Scrum, ensure delivery transparency, and drive continuous improvement
- Conduct team meetings and 1-on-1s, and ensure team well-being

Your competencies

Technical Expertise
- Experience working in enterprise organizations
- Experience leading hybrid (internal + external) software development teams and providing technical direction
- Strong expertise in designing, reviewing, and improving system and solution architectures
- Ability to translate business needs into secure, scalable technical solutions
- Solid cloud computing expertise with a focus on Azure (resources, environments, deployments, cost control)
- Good understanding of infrastructure, security, and compliance standards
- Experience managing releases, documentation, access, and licenses
- Strong capability in defining integration requirements and communicating with both technical and non-technical stakeholders
My employer
A successful, growing mid-sized provider of relevant and innovative data- and AI-driven solutions that optimize decision-making for well-known companies, this employer offers an open-minded, agile, progress-oriented culture in which teamwork, personal responsibility, and continuous learning matter greatly. Creative freedom in challenging projects, long-term prospects, and flexibility are included.

- Design, implement, and evolve Generative AI and NLP systems that optimally meet customer requirements for performance, latency, cost, and extensibility
- Align regularly with stakeholders on requirements
- Work closely with developers and technical leads on implementation and on use cases such as retrieval-based chatbots, agent systems, or fine-tuning of language models
- Plan and implement robust machine learning pipelines according to best practices, on Azure, AWS, or GCP
- Provide input on complex technical challenges and present solution approaches
- Actively track new developments in NLP and AI to always offer customers modern, high-quality solutions

- Successfully completed degree in (business) informatics or a comparable qualification
- Relevant professional experience in data science, ideally with several of the typical NLP/LLM tools such as OpenAI APIs, Bedrock, Azure AI Foundry, LangChain, LangGraph, Instructor, Hugging Face, tokenizers, vector databases, performant inference, model deployment, MCP/A2A, and dataset creation
- Very good knowledge of machine learning and deep learning, especially transformer models, LLMs, and generative AI
- Confident use of production-ready frameworks such as PyTorch, as well as agent frameworks such as LangGraph, SmolAgents, OpenAI Agent SDK, CrewAI, or PydanticAI
- Deep understanding of model optimization with PEFT such as QLoRA, instruction fine-tuning, post-training, inference optimization, and embeddings
- Confident with workflows such as conversational AI, RAG, information extraction, tool calling, and LLM evaluation
- Knowledge of agentic RAG, GraphRAG, multi-agent systems, text-to-SQL, and code retrieval
- Very good understanding of deployments and MLOps on Azure, GCP, or AWS
- High standards for software quality and the ability to write clean, performant, scalable code and put AI systems into production
- The ability to translate complex requirements into technical solutions and convey them confidently to non-experts, plus good German and English skills

- Independent work and opportunities to shape things, thanks to short decision-making paths
- Growth through a focus on innovation: the varied and diverse projects revolve around developing intelligent algorithms, data-based strategies, and tailored AI solutions
- A committed, dynamic, constructive team with strong cohesion and an open feedback culture
- Well-connected, modern offices and high-quality technical equipment
- A flexibly plannable remote share of up to 40%, even temporarily from elsewhere in the EU
- Subsidies for the Deutschland-Ticket, for sports and wellness offers, and for childcare

Salary information
Depending on experience, up to €100,000 p.a.
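As a concrete illustration of the retrieval step behind the RAG workflows this role involves, here is a minimal sketch with toy vectors. No real embedding model or vector database is used; the documents and vectors are invented for illustration only:

```python
# Minimal sketch of RAG retrieval: rank documents by cosine similarity
# between a query embedding and document embeddings. Production systems
# would call an embedding model and a vector database; the tiny 2-d
# vectors and documents below are hypothetical.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, corpus, k=2):
    """Return the texts of the k documents closest to the query vector."""
    ranked = sorted(corpus, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in ranked[:k]]

corpus = [
    {"text": "refund policy",  "vec": [1.0, 0.1]},
    {"text": "shipping times", "vec": [0.0, 1.0]},
    {"text": "return shipping", "vec": [0.7, 0.7]},
]
print(retrieve([1.0, 0.0], corpus, k=2))  # ['refund policy', 'return shipping']
```

The retrieved texts would then be stuffed into the LLM prompt as context; everything after this step is generation, not retrieval.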
We look forward to hearing from you.

- Manage and refine business and technical requirements in collaboration with stakeholders
- Coordinate data integration activities with various source systems
- Design and model data structures within a Data Warehouse environment, with a strong focus on Data Vault methodology
- Develop and optimize data pipelines using SQL and Python
- Work with tools like Databricks and dbt to build scalable data transformation workflows
- Ensure data quality, consistency, and compliance, especially within banking-related use cases

- Experience in requirements management
- Experience in coordination with source systems
- Experience with data modeling in a Data Warehouse environment, with a focus on Data Vault
- Good German and English language skills
- Databricks experience is nice to have
- Experience with dbt (data build tool) is an advantage
- Experience with SQL (as a query language) and Python is an advantage
- Banking experience is an advantage

- Renowned client
- Remote work

Your contact: Florian Pracher, reference number 863466/1. Email: florian.pracher@hays.at. Employment type: freelance, for a project.
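For context on the Data Vault focus above: in Data Vault modeling, hub tables are commonly keyed by a hash of the normalized business key, so the same entity from different source systems resolves to one hub row. A minimal sketch (the key values and the MD5 choice are illustrative assumptions, not from the posting):

```python
# Sketch of computing a Data Vault hub hash key: normalize the business
# key parts (trim, upper-case), join with a delimiter, then hash.
# MD5 is a common convention here; the key values are hypothetical.
import hashlib

def hub_hash_key(*business_key_parts, delimiter="||"):
    """Trim and upper-case each part, join with the delimiter, MD5-hash."""
    normalized = delimiter.join(p.strip().upper() for p in business_key_parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# The same customer number with different source formatting
# yields the same hub key:
print(hub_hash_key("c-1001 ") == hub_hash_key("C-1001"))  # True
```

The delimiter matters: without it, composite keys like `("A", "B")` and `("AB",)` would collide.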
- Maintain and troubleshoot data integration pipelines to ensure stable data flow into AI and analytics systems
- Support model development by assisting with training, validation, and optimization of machine learning workflows
- Conduct data analysis to extract insights and provide clear reports supporting R&D research questions
- Solve technical challenges related to data access, pipeline performance, and software limitations
- Ensure continuity of ongoing projects by aligning closely with the core team and delivering on timelines
- Perform image analysis and prepare datasets required for scientific and ML use cases
- Manage and improve ETL processes to ensure data quality, structure, and availability
- Document workflows, pipeline changes, and analytical steps to ensure clarity and reproducibility

- Academic background in computer science, data science, engineering, or a related quantitative field
- Strong proficiency in Python with expertise in scientific and analytical libraries
- Skilled in SQL and working with relational databases
- Understanding of ETL concepts and practical experience working with data pipelines
- Solid foundation in machine learning principles and the model lifecycle
- Ability to perform image analysis for scientific or research applications
- Strong communication and interpersonal skills with the ability to collaborate in a technical team
- Independent, structured problem-solver with a commitment to clear documentation and FAIR data practices

- Opportunity to contribute directly to active R&D projects with immediate real-world impact
- Hands-on involvement in AI, machine learning, and data integration challenges in a scientific environment
- Close collaboration with a small, highly skilled technical team

Your contact: reference number 863771/1. Phone: +41 44 225 50 00. Email: positionen@hays.ch. Employment type: freelance, for a project.
- Design, build, and optimize batch data pipelines for internal tool use cases
- Develop efficient Spark SQL transformations for large-scale datasets
- Use Python for data processing, orchestration, and automation
- Create and maintain data models (facts, dimensions, aggregates) with clear grain and metric definitions
- Ensure data quality and correctness, including handling late data, duplicates, and adjustments
- Implement validation, data quality checks, and reconciliation logic
- Work with business stakeholders to gather requirements, define metrics, and translate needs into pipelines
- Collaborate with infrastructure teams on standards, performance tuning, and best practices

- Bachelor's or Master's degree in a technical field, or an equivalent qualification
- Experience in data engineering or a related field
- Strong proficiency in Spark SQL for large-scale data transformations
- Solid Python skills for data processing and pipeline development
- Strong understanding of data modeling (fact tables, dimensions, grain, SCDs)
- Hands-on experience building and maintaining batch pipelines in production
- High attention to detail with a strong focus on data quality and metric integrity
- Ability to communicate clearly with non-technical stakeholders and translate business needs into data solutions

- Remuneration under the most attractive collective agreement in the industry
- Annual leave entitlement of 30 days
- Generous working time account with the option of paid overtime
- Subsidized direct insurance (as a company pension scheme)

Your contact: Kristina Meng, reference number 863942/1. Email: kristina.meng@hays.de. Employment type: employment with Hays Professional Solutions GmbH.
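The late-data and duplicate handling mentioned above usually comes down to keeping, per business key, the most recently loaded record. Here is a hedged pure-Python stand-in for what would typically be a `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ... DESC)` filter in Spark SQL; all field names are hypothetical:

```python
# Sketch of deduplicating late-arriving records: for each key, keep the
# row with the highest load timestamp. In Spark SQL this would be a
# window-function filter; here it is a dict keyed by the business key.
# Field names (order_id, loaded_at, amount) are hypothetical.
def latest_per_key(rows, key="order_id", ts="loaded_at"):
    best = {}
    for r in rows:
        k = r[key]
        # ISO-8601 date strings compare correctly as plain strings
        if k not in best or r[ts] > best[k][ts]:
            best[k] = r
    return sorted(best.values(), key=lambda r: r[key])

rows = [
    {"order_id": 1, "loaded_at": "2024-01-01", "amount": 10},
    {"order_id": 1, "loaded_at": "2024-01-03", "amount": 12},  # late correction
    {"order_id": 2, "loaded_at": "2024-01-02", "amount": 7},
]
print(latest_per_key(rows))
# [{'order_id': 1, 'loaded_at': '2024-01-03', 'amount': 12},
#  {'order_id': 2, 'loaded_at': '2024-01-02', 'amount': 7}]
```

A reconciliation check would then assert that the deduplicated row count matches the count of distinct keys in the source.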
- Work data-driven, based on reports and analyses, to derive actions oriented toward customer needs
- Manage and communicate with all relevant stakeholders
- Understand stakeholder demands, needs, and issues; translate them into actionable, feasible product requirements and build scalable tech solutions
- Triage operational issues, identify technical dependencies, and evaluate business impact
- Lead cross-team projects and collaborate closely with other tech teams to execute end-to-end solutions

Who you are
- Master's degree or equivalent in a relevant field
- 7–10 years of experience in product management for tech products within software development, focused on backend systems
- Proven expertise in agile methodologies and collaboration with developers, data scientists, and QA managers
- Strong understanding of technology, business processes, and functional dependencies
- Hands-on experience with e-commerce systems, ideally in retail
- Skilled in IT project management and leading cross-functional initiatives
- At least 2 years of team leadership experience
- Analytical, detail-oriented, and customer-focused with a proactive mindset
- Exceptional communication skills in English, adaptable to diverse perspectives
- Comfortable in international work environments, with a focus on efficiency and solutions

Benefits
Hybrid working, fresh fruit every day, sports courses, free access to code.talks, exclusive employee discounts, free drinks, language courses, free Laracast account, company parties, help with the relocation process, mobility subsidy, state-of-the-art technology, central location, flexible working hours, company pension, professional training, dog-friendly office, remote work, AY Academy, feedback culture, job bikes.

YOU ARE THE CORE OF ABOUT YOU.
We are looking for a candidate for a 6-month internship within the Footwear & Technical Equipment division, supporting the implementation project for the new PLM system Kubix Link. The selected person will work within the project development team and will be in direct contact with the R&D teams, material suppliers, and finished-product suppliers, collaborating on the collection and structuring of the data needed to enter and manage information in the system.
Role Purpose As the Sales Engineer, you have a history of honing and exercising technical, quantitative, and commercial capabilities to achieve material business outcomes. You are obsessed with understanding a client’s business and the best mechanism of extending a platform to help them meet their needs.
You listen to and act on feedback to help product and core engineering iterate on the product. You thrive in meetings, whether a technical sync or a commercial pitch, by understanding your audience and modulating your delivery appropriately.

Principal Accountabilities
The primary focus is pre-sales technical support for IQVIA's Incentive Compensation solution built on the Orchestrated Analytics ecosystem, which spans MDM, data warehousing, AI/ML, and data visualisation technology solutions.
What makes you stand out
- You hold a degree in Business Informatics, Data Science, Computer Science, Industrial Engineering, Business Administration, or a comparable field.
- You bring experience in Revenue Operations, Sales Operations, Sales Analytics, or a similar commercial analytics role.
- You have a strong understanding of sales processes, pipeline management, forecasting, and revenue metrics, and you can translate them into technical requirements for engineers.
- You are a great stakeholder manager who can talk to sales and engineering alike.
- You are highly proficient in Power BI and experienced in building dashboards that drive action.
- You are comfortable working with CRM data and sales systems (e.g.
Senior Data Scientist – Data Science & Advanced Analytics – Frankfurt, Germany

The Senior Data Scientist is responsible for designing, deploying, and continuously optimizing scalable ML solutions that translate business requirements into measurable impact across EMEA markets. The position combines technical execution with business alignment, ensuring solutions remain adaptable to evolving commercial needs while maintaining architecture consistency and standardization.
The Oberalp Group is a management-driven family business, a house of brands that creates high-quality technical mountaineering products. We have six own brands: Salewa, Dynafit, Pomoca, Wild Country, Evolv, and LaMunt. As an exclusive partner of other brands in the sports sector, we offer our entire know-how in communication, sales, and image building.
Why does ERP matter at Lasernet Group? At Lasernet Group, ERP is not just a technical integration; it is the core context in which our customers operate. Lasernet sits between ERP systems and the compliance-critical, customer-facing communication that follows every business transaction.
What makes you stand out:
- You have at least 5–6 years of experience as a Product Manager in digital product management.
- You demonstrate strong customer orientation and problem-definition skills, independently understanding, defining, articulating, and evaluating complex customer problems and opportunities using data analyses and qualitative insights, and making targeted decisions based on them.
- You show excellent product judgment, balancing customer needs, business impact, and technical constraints to define, advocate for, and deliver the right product.
- You collaborate intensively with cross-functional teams of engineers and designers to deliver effective solutions quickly, and you are also able to accelerate the process independently using AI.
- You define the roadmap for features, products, or priorities in your area and communicate what, when, and why in a way that inspires customers and the team.
- You feel confident working with engineers and customers on modern technologies, from APIs and SaaS infrastructure to low-code tools and system architecture.
- You are results-oriented, defining, advocating for, and communicating progress regarding the impact your product area or project has on customers and the company.
- You demonstrate strong leadership skills, communicate effectively, take ownership, and are able to influence and drive alignment.
- You speak fluent German and English.
Key Responsibilities:
- Develop and maintain detailed project budgets, cost estimates, and financial forecasts for data center construction and infrastructure projects, tracking expenditures against approved budgets
- Prepare comprehensive cost reports, cash flow projections, and value engineering analyses to optimize project costs while maintaining quality standards and technical requirements
- Coordinate with project managers, contractors, and procurement teams to evaluate contract proposals, change orders, and variation requests while ensuring cost-effective project delivery
- Conduct risk assessments for cost implications, develop contingency strategies, and monitor market conditions affecting material and labor costs in data center construction
- Review and validate contractor invoices, progress payments, and final accounts while maintaining detailed cost documentation and audit trails for financial compliance
- Support procurement processes, tender evaluations, and contract negotiations to achieve optimal value while facilitating cost reconciliation, lessons learned, and knowledge transfer for future projects

Qualifications & Skills:
- Bachelor's degree in Quantity Surveying, Construction Management, Engineering, or a related field, with several years of experience in cost management for data center or mission-critical facility projects
- Comprehensive knowledge of data center construction costs and industry pricing trends, and the ability to interpret technical specifications for accurate cost estimation and budget development
- Strong financial analysis expertise, with proficiency in cost management software and spreadsheet applications and a demonstrated ability to prepare detailed cost reports and forecasts
- Excellent analytical, communication, and stakeholder management skills, with a proven ability to negotiate with contractors and suppliers while managing cost-related project risks
- Experience with value engineering, life-cycle costing, and cost optimization techniques in complex construction environments, with an understanding of procurement processes and contract administration

Jones Lang LaSalle SE, Human Resources. Your contact: Jan Bauermann, Talent Acquisition Partner EMEA, jan.bauermann@jll.com. Location: On-site, Frankfurt am Main, DEU. If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements.
Your Profile:
- Educational Background: Completed studies in (Business) Informatics, Engineering, Mathematics, Physics, or Business Administration with a quantitative and technical focus.
- Relevant Experience: A minimum of three years in data-driven process analysis and optimization/redesign with digital tools, including at least one year identifying and implementing AI automation solutions.
You manage the backlog, prioritize based on value, and translate epics and features into user stories with clear acceptance criteria. You ensure Definition of Ready and Definition of Done, orchestrate refinement sessions, manage dependencies and testability, and thereby enable efficient, low-risk delivery. You ensure that requirements contribute directly to customer value, conversion, checkout efficiency, and overall KPI impact. You coordinate deliveries with internal and external development teams (without coding yourself) and ensure high implementation quality. You support the release and quality concept (regression, functional, end-to-end testing) and drive the further development of test automation and stable releases across connected systems. You provide management-level reporting on status, risks, forecasts, and dependencies (e.g. quarterly, depending on governance). You systematically identify growth opportunities, collaborate closely with relevant stakeholders, bridge technical and business perspectives, and actively contribute to further professionalizing and scaling our webshop.
What makes you stand out:
- You have several years of experience as a Product Owner, (Digital) Product Manager, Business Analyst, or in a comparable role within an eCommerce or platform environment.
- You have a strong understanding of Agile/Scrum, refinement, sprint planning, prioritization, and stakeholder management, and are experienced in translating complex requirements into clear, testable user stories.
- You have a strong interest in technical concepts and are able to understand and structure system landscapes, interfaces, and data flows.
- Ideally, you have experience with Adobe Commerce (Magento) or comparable eCommerce platforms.
- Experience working with downstream systems such as CRM (e.g.
Principal Accountabilities: The primary focus is pre-sales technical support for IQVIA's Technologies Orchestrated Customer Engagement.
We are currently looking for a skilled and passionate Technical Process Owner (m/f/d) to become part of one of our Supply Tech teams at ABOUT YOU! This role combines the core responsibilities of a Product Owner with a deep focus on technical process ownership.
YOUR TASKS:
- Design, develop, and deploy digital solutions, following the software development life cycle in an agile setup
- Develop solutions on a leading-edge cloud-based platform for managing and analyzing large datasets
- Create technical documentation
- Analyze and decompose business requirements into technical functionalities
- Produce clean and efficient code based on business requirements and specifications
- Create notebooks, pipelines, and workflows in Scala or Python to ingest, process, and serve data in our platform
- Be a technical lead for junior and external developers
- Be part of the continuous improvement of Nordex's development processes by participating in retrospectives and proposing optimizations
YOUR PROFILE:
- Technical degree in Computer Science, Software Engineering, or comparable
- Experience or certification in Databricks
- Fluent English
- At least 3 years of proven experience
- Availability to travel
YOUR BENEFITS: In addition to the opportunity to make our world a little more sustainable, we offer you: *Some offers may vary by location. **Hybrid working in accordance with the company's internal policy.
We are seeking a talented Enterprise Architect – Tools and Monitoring to join our team. You will be expected to deliver technical and architecture designs and formulate standards related to the IT monitoring landscape (infrastructure, applications, and business processes).
Requirements:
- A university degree in a technical discipline is required (Engineering, Materials Science, Industrial Technologies, or similar).
- 2–4 years of experience in Product Management, Application Engineering, Technical Marketing, Innovation, or Product Development within an industrial B2B environment (preferably with engineered technical products).
- Working in an interdisciplinary team of engineers to develop and improve designs and manufacturing processes for thick-film sensors
- Improve and maintain the data infrastructure and pipeline for production and process control data from various sources and ensure timely data availability
- Act as a technical interface between R&D and Production and between various R&D departments to harmonize data handling and standards
- Improve and maintain data visualization tools (dashboards, interactive charts) and support routine data analysis
- Support the definition and improvement of image analysis methods and tools to derive quantitative feature values from images
- Extend the data infrastructure with additional information, e.g. from sensor performance characterization
- Data-driven improvements of manufacturing processes

- Completed technical training in process engineering, data science, bioinformatics, or a similar professional education
- Professional experience in an industrial R&D or manufacturing environment, ideally in the medical device industry or a comparable regulated environment
- Experience in building and maintaining data pipelines (ETL processes) from diverse sources such as SQL databases, CSV files, and machine log files
- Ability to create interactive dashboards and visualization tools, with a solid understanding of applied statistics (e.g. correlation analysis, cluster analysis) to support the development teams
- Skills in digital image processing, object-oriented programming (OOP) in Python, and knowledge of SQL are a strong advantage, adding significant value to this opportunity
- Good communication skills in a multicultural and multidisciplinary environment
- A thorough way of working and documenting
- A motivated team player with a passion for promoting and driving fast-paced and ambitious projects
- Aptitude to understand and improve the underlying technical processes
- Proficiency in both English and German

- Unlimited project contract
- Fascinating, innovative environment in an international atmosphere

Your contact: Jannik Fabio Eichin. Reference number: 865639/1. Email: jannik.eichin@hays.ch. Employment type: Freelance, for a project.
Openness to change and feedback is what defines our culture. We support you on your journey: individual learning opportunities, worldwide job opportunities, and technical training from our academy. The safety and well-being of our employees is important to us, which is why we set high standards for your workplace safety.
Collaboration and Communication: Work closely with stakeholders, including business users, IT teams, and external partners. Communicate technical information to non-technical stakeholders, ensuring clear understanding of application capabilities and limitations. Documentation and Knowledge Management: Develop and maintain documentation for applications, including design documents, user guides, and technical notes.
Ruhr:
- Strategic, analytical mindset with a strong data quality focus
- Independent, ownership-driven working style
- Clear and structured documentation skills
- Mentoring and knowledge-sharing mindset
- At least six years' experience in implementation projects, of which at least two years must have been spent in thematically comparable IT projects
- Special knowledge in several technical areas
PM:
- At least two years' experience in IT project management with full management responsibility for the project staff
- Management of Cat.
Support global sales teams with technical and commercial expertise during tenders and negotiations. Develop business cases and proposals for new service products and drive decision‑making processes.
CADENAS GmbH is looking for a Product Manager (m/f/d) – Technical in Augsburg (ID number: 13572099)